Metamodel-based importance sampling for the simulation of rare events
In the field of structural reliability, the Monte Carlo estimator is
considered the reference probability estimator. However, it remains
intractable for real engineering cases since it requires a large number of
runs of the model. In order to reduce the number of computer experiments, many
other approaches, known as reliability methods, have been proposed. One of
them consists in replacing the original experiment with a surrogate which is
much faster to evaluate. Nevertheless, it is often difficult (or even
impossible) to quantify the error made by this substitution. In this paper an
alternative approach is developed. It takes advantage of kriging metamodeling
and importance sampling techniques. The proposed alternative estimator is
finally applied to a finite element based structural reliability analysis.
Comment: 8 pages, 3 figures, 1 table. Preprint submitted to ICASP11
Mini-symposia entitled "Meta-models/surrogate models for uncertainty
propagation, sensitivity and reliability analysis"
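The cost of the reference Monte Carlo estimator described above can be sketched as follows. This is a minimal illustration, not the paper's method: the limit-state function, the standard normal input, and the sample size are hypothetical stand-ins for an expensive structural model.

```python
# Crude Monte Carlo estimate of a rare failure probability, showing why the
# reference estimator needs many model runs. The limit-state function
# g(x) = 3 - x (failure when g <= 0) is a made-up stand-in for a real model.
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    return 3.0 - x  # failure domain: x >= 3

n = 1_000_000
x = rng.standard_normal(n)
p_hat = np.mean(g(x) <= 0.0)

# Coefficient of variation of the estimator is sqrt((1 - p) / (n * p)):
# for p around 1e-3, millions of runs are needed for a few-percent CoV.
cov = np.sqrt((1.0 - p_hat) / (n * p_hat))
print(f"p_hat = {p_hat:.3e}, CoV = {cov:.1%}")
```

The coefficient of variation printed at the end makes the point quantitatively: halving it requires four times as many runs of the model.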
Metamodel-based importance sampling for structural reliability analysis
Structural reliability methods aim at computing the probability of failure of
systems with respect to some prescribed performance functions. In modern
engineering such functions usually resort to running an expensive-to-evaluate
computational model (e.g. a finite element model). In this respect simulation
methods, which may require a large number of model runs, cannot be used
directly. Surrogate
models such as quadratic response surfaces, polynomial chaos expansions or
kriging (which are built from a limited number of runs of the original model)
are then introduced as a substitute of the original model to cope with the
computational cost. In practice it is almost impossible to quantify the error
made by this substitution though. In this paper we propose to use a kriging
surrogate of the performance function as a means to build a quasi-optimal
importance sampling density. The probability of failure is eventually obtained
as the product of an augmented probability computed by substituting the
meta-model for the original performance function and a correction term which
ensures that there is no bias in the estimation even if the meta-model is not
fully accurate. The approach is applied to analytical and finite element
reliability problems and proves efficient up to 100 random variables.
Comment: 20 pages, 7 figures, 2 tables. Preprint submitted to Probabilistic
Engineering Mechanics
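The product form of the estimator described above (an augmented probability computed with the metamodel, times a correction term evaluated on the true model) can be sketched as follows. Everything here is hypothetical: the one-dimensional performance function, the kriging mean and standard deviation (which in practice come from a fitted kriging model), and the sample sizes.

```python
# Sketch of a metamodel-based importance sampling estimator:
#   P_f = P_f_eps * alpha,
# where P_f_eps = E_p[pi(X)] uses only the surrogate, and the correction
# alpha = E_h[1{g(X) <= 0} / pi(X)] needs a small number of true-model runs.
# The surrogate (mu, sigma) below is a deliberately biased toy, not a real fit.
import math
import numpy as np

rng = np.random.default_rng(1)
norm_cdf = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def g(x):                        # true (expensive) performance function
    return 3.0 - x               # failure when g <= 0

def kriging_pi(x):               # probabilistic classification from the surrogate
    mu, sigma = 3.1 - x, 0.3     # hypothetical kriging mean / st. dev. (biased)
    return norm_cdf(-mu / sigma) # pi(x) = P(surrogate predicts failure at x)

# Step 1: augmented probability under the input density p (no calls to g).
m = 200_000
x_p = rng.standard_normal(m)
pi_p = kriging_pi(x_p)
p_eps = pi_p.mean()

# Step 2: draw from the quasi-optimal IS density h ∝ pi * p by resampling.
n = 2_000
idx = rng.choice(m, size=n, replace=True, p=pi_p / pi_p.sum())
x_h = x_p[idx]

# Step 3: correction term -- only these n points require running g.
alpha = np.mean((g(x_h) <= 0.0) / kriging_pi(x_h))

p_f = p_eps * alpha
print(f"P_f ~ {p_f:.3e} (augmented {p_eps:.3e} x correction {alpha:.3f})")
```

Note how the correction term removes the bias of the surrogate: even though the toy kriging mean is shifted relative to the true limit state, the final estimate stays consistent, at the price of n evaluations of the true model.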
Sequential design of computer experiments for the estimation of a probability of failure
This paper deals with the problem of estimating the volume of the excursion
set of a function above a given threshold, under a probability measure on the
input space that is assumed to be known. In
the industrial world, this corresponds to the problem of estimating a
probability of failure of a system. When only an expensive-to-simulate model of
the system is available, the budget for simulations is usually severely limited
and therefore classical Monte Carlo methods ought to be avoided. One of the
main contributions of this article is to derive SUR (stepwise uncertainty
reduction) strategies from a Bayesian-theoretic formulation of the problem of
estimating a probability of failure. These sequential strategies use a
Gaussian process model of the function and aim at performing evaluations of it
as efficiently as
possible to infer the value of the probability of failure. We compare these
strategies to other strategies also based on a Gaussian process model for
estimating a probability of failure.
Comment: This is an author-generated postprint version. The published version
is available at http://www.springerlink.co
Risk factors for childhood overweight in 6- to 7-y-old Hong Kong children
OBJECTIVE: To identify risk factors for overweight in Hong Kong children aged 6-7 y.
DESIGN: Case-control study.
SETTING: Student Health Service Centres, Hong Kong.
SUBJECTS: A total of 343 Hong Kong Chinese children aged 6-7 y, categorised into three groups: an overweight group (≥ 92nd centile for BMI), a normal middle-weight group (45th-55th centile for BMI) and a normal low-weight group (≤ 8th centile for BMI).
MEASUREMENTS: Subjects and their parents/caregivers were interviewed at home. Data on lifestyle habits, dietary habits, family structure and demographic background were collected by questionnaire. A 3-day dietary record was administered by the parents/caregivers to assess dietary intake of the children.
RESULTS: Logistic regression analyses (overweight group compared with middle-weight plus low-weight groups) showed that childhood overweight was significantly associated with parental obesity (BMI ≥ 25 kg/m2, Asian reference) (paternal: OR = 2.66, 95% CI = 1.51-4.70; maternal: 5.07, 2.62-9.79) but not parental overweight (BMI = 23-25 kg/m2). After adjustment for parental obesity, the odds ratio for childhood overweight was increased by birth weight (< 3.0 kg as reference; 3.0-3.5 kg: 2.13, 1.18-3.84; ≥ 3.5 kg: 4.89, 2.49-9.60) and decreased by sleeping duration (< 9 h/day as reference; 9-11 h/day: 0.54, 0.30-0.97; ≥ 11 h/day: 0.31, 0.11-0.87). Childhood overweight was also significantly associated with higher energy consumption (2.62, 1.20-5.74) and having a father who was a current smoker (2.08, 1.25-3.46).
CONCLUSIONS: Although healthy diet and regular exercise will remain the cornerstones of obesity management in children, our data support the view that education about maintaining a healthy weight could be introduced much earlier in families with high-risk children, as indicated by high parental BMI or high birth weight.
The utility and practicality of such an approach should be carefully evaluated before becoming part of any public health policy. Further study of the role of short sleeping duration and parental smoking in childhood obesity development is warranted.
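The odds ratios and confidence intervals reported above come from logistic regression on the study data. For illustration only, an unadjusted odds ratio and a Woolf-type 95% confidence interval can be computed from a 2x2 table; the counts below are entirely made up and are not the study's data.

```python
# Unadjusted odds ratio with a Woolf (log-scale normal) 95% CI from a 2x2
# table. All counts are hypothetical; the abstract's ORs are adjusted
# estimates from logistic regression, not this crude calculation.
import math

# Rows: child overweight yes/no; columns: parent obese (exposed) yes/no.
a, b = 60, 40     # overweight children: exposed / unexposed (made-up)
c, d = 50, 110    # non-overweight children: exposed / unexposed (made-up)

odds_ratio = (a * d) / (b * c)
se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log)
print(f"OR = {odds_ratio:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

An interval excluding 1, as in this toy table, is what "significantly associated" means for the ORs quoted in the abstract.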